8 - Diagnostic Medical Image Processing [ID:10383]

Good morning.

Let's consider the storyline of diagnostic medical image processing.

As usual, Monday morning, we just reconsider what we have discussed so far, the big picture.

So we learned a little bit about modalities that are used in medical imaging.

And currently we are in a chapter on pre-processing, where we are looking at artifacts that are introduced by the acquisition device, and we consider correction methods that improve the images on their way from the sensor to the monitor.

And we have seen X-ray imaging. The sensor we considered first was the image intensifier, and we looked into a class of algorithms for image undistortion.

The undistortion mapping was calibrated using a calibration pattern and a least-squares estimator.

So these are the mathematical techniques we have seen so far: SVD and the least-squares estimator.
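
As a rough illustration of how such a calibration can look, here is a minimal sketch in Python, assuming a quadratic polynomial distortion model; the function names and the synthetic warp are illustrative assumptions, not the lecture's exact formulation.

```python
import numpy as np

# Minimal sketch: calibrate a polynomial undistortion mapping from matched
# points (distorted positions -> ideal grid positions) with a least-squares
# estimator. The quadratic model and all names are illustrative assumptions.

def design_matrix(pts):
    """Quadratic monomials of the distorted coordinates (x, y)."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_undistortion(distorted, ideal):
    """Least-squares fit of ideal ~ design_matrix(distorted) @ coeffs;
    numpy's lstsq solves this via an SVD of the design matrix."""
    A = design_matrix(distorted)
    coeffs, *_ = np.linalg.lstsq(A, ideal, rcond=None)
    return coeffs  # shape (6, 2): one column per output coordinate

def undistort(pts, coeffs):
    return design_matrix(pts) @ coeffs

# Toy calibration pattern: a regular grid, warped by a synthetic distortion.
ideal = np.array([[i, j] for i in range(5) for j in range(5)], float)
distorted = ideal + 0.01 * ideal**2
coeffs = fit_undistortion(distorted, ideal)
print(np.abs(undistort(distorted, coeffs) - ideal).max())  # small (sub-pixel)
```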

And then we considered the flat-panel technology and the problem of defect pixel interpolation.

And in this context we talked about the Fourier transform and associated algorithms, so interpolation in the spatial domain and interpolation in the frequency domain.
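
A minimal sketch of both flavors, assuming a boolean defect mask that marks the broken detector elements; the frequency-domain variant is in the spirit of Gerchberg/Papoulis-type band-limited reconstruction, and all names and parameter values are illustrative.

```python
import numpy as np

def interpolate_spatial(img, defect_mask):
    """Replace each defect pixel by the mean of its valid 4-neighbors."""
    out = img.astype(float).copy()
    rows, cols = img.shape
    for i, j in zip(*np.nonzero(defect_mask)):
        nbrs = [out[u, v]
                for u, v in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if 0 <= u < rows and 0 <= v < cols and not defect_mask[u, v]]
        if nbrs:
            out[i, j] = np.mean(nbrs)
    return out

def interpolate_frequency(img, defect_mask, n_iter=50, keep=0.25):
    """Alternate between a low-pass constraint in the Fourier domain and
    re-imposing the known pixels in the spatial domain."""
    est = img.astype(float) * ~defect_mask
    fy = np.abs(np.fft.fftfreq(img.shape[0]))[:, None]
    fx = np.abs(np.fft.fftfreq(img.shape[1]))[None, :]
    lowpass = (fy < keep / 2) & (fx < keep / 2)   # keep only low frequencies
    for _ in range(n_iter):
        est = np.real(np.fft.ifft2(np.fft.fft2(est) * lowpass))  # band-limit
        est[~defect_mask] = img[~defect_mask]     # restore known pixels
    return est
```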

And last week we started to look into MRI, so magnetic resonance imaging.

And in MRI we have the problem that we get a bias field, due to inhomogeneities of the magnetic field caused by the patient position and the coils used.

And the bias field correction has to be done in a way that we can eliminate a low-frequency bias field that corrupts the original MRI image multiplicatively or additively.

And in this context we talked about Fourier methods: low-pass filtering combined with high-pass filtering.
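
For the multiplicative case, the standard homomorphic trick can be sketched as follows, assuming a Gaussian kernel as the low-pass filter; the sigma value and the mean renormalization are illustrative choices, not values from the lecture.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_bias(img, sigma=32.0, eps=1e-6):
    """Remove a smooth multiplicative bias field via log / low-pass / exp."""
    log_img = np.log(img.astype(float) + eps)    # multiplicative -> additive
    bias_est = gaussian_filter(log_img, sigma)   # low-pass: smooth bias estimate
    corrected = np.exp(log_img - bias_est)       # high-pass part, back to intensities
    return corrected * img.mean() / corrected.mean()  # restore global level
```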

But we also introduced an important concept last week.

You cannot read my handwriting?

What is the problem?

Any problem?

No problem.

So we have learned the important concept of KL divergence.

And the KL divergence plays a similar role to the squared difference we use in least-squares estimators.

We use the KL divergence to measure the similarity of two probability density functions.

So that is a measure that allows us to consider the similarity of PDFs.

And the definition was

$$\mathrm{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,$$

or the corresponding sum in the discrete case.
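
In code the discrete version is short; a minimal sketch, assuming normalized histograms on common bins and a small epsilon to guard the logarithm.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence of two normalized histograms p and q."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print(kl_divergence(p, q))  # > 0: the densities differ
print(kl_divergence(p, p))  # ~0: identical densities
```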

And last week we have also seen that two important concepts from information theory, which at least a few of you know, can be derived from the definition of the KL divergence.

One is the entropy, defined by

$$H(p) = -\int p(x) \log p(x) \, dx.$$

We have seen that this is, up to sign and an additive constant, nothing else but the KL divergence we get if we compare p to the uniform density.

You remember that?

And the entropy measures the distance of the probability density function P to the uniform

density.

The higher the entropy, the closer the PDF p is to the uniform density.
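
This relation is easy to check numerically: for a PMF on N bins, KL(p || u) = log N - H(p), so maximal entropy means zero divergence from the uniform density. A minimal sketch of that check, with illustrative names.

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Discrete (Shannon) entropy with natural logarithm."""
    p = np.asarray(p, float)
    return float(-np.sum(p * np.log(p + eps)))

def kl_divergence(p, q, eps=1e-12):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# For a PMF on N bins: KL(p || uniform) = log(N) - H(p).
p = np.array([0.5, 0.3, 0.2])
u = np.full(3, 1.0 / 3.0)
print(np.isclose(kl_divergence(p, u), np.log(3) - entropy(p)))  # True
```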

And we also have seen the mutual information on Thursday. The mutual information, that is, sorry, you should abbreviate it by mutual
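
The standard way to obtain the mutual information from the KL divergence is as the divergence between the joint density and the product of the marginals, $I(X;Y) = \mathrm{KL}(p(x,y) \,\|\, p(x)\,p(y))$; a minimal sketch for the discrete case, with an illustrative function name.

```python
import numpy as np

def mutual_information(joint, eps=1e-12):
    """MI as the KL divergence between the joint PMF and the
    product of its marginals."""
    joint = np.asarray(joint, float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    return float(np.sum(joint * np.log((joint + eps) / (px * py + eps))))

print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent: ~0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # dependent: ~log 2
```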

Accessible via: Open Access

Duration: 01:17:43 min

Recording date: 2014-11-03

Uploaded on: 2019-04-09 20:49:02

Language: en-US

  • Modalities of medical imaging
  • Acquisition-specific image preprocessing
  • 3D reconstruction
  • Image registration
